Bibliographic record - detail view
Authors | Jacobson, Miriam R.; Whyte, Cristina E.; Azzam, Tarek |
Title | Using Crowdsourcing to Code Open-Ended Responses: A Mixed Methods Approach |
Source | In: American Journal of Evaluation, 39 (2018) 3, pp. 413-429 (17 pages) |
Full text | PDF |
Language | English |
Document type | print; online; journal article |
ISSN | 1098-2140 |
DOI | 10.1177/1098214017717014 |
Keywords | Mixed Methods Research; Evaluation Methods; Comparative Analysis; Feedback (Response); Coding; Questioning Techniques; Collaborative Writing; Electronic Publishing; Data Collection; Online Surveys; Evaluators |
Abstract | Evaluators often work with brief units of text-based data, such as open-ended survey responses, text messages, and social media postings. Online crowdsourcing is a promising method for quantifying large amounts of text-based data by engaging hundreds of people to categorize the data. To further develop and test this method, individuals were recruited through online crowdsourcing to code open-ended survey responses, using a predetermined list of thematic codes derived from the responses. The study compared the coding results obtained from online crowdsourcing with coding results obtained from researcher coders. Additionally, the study examined feedback from the crowdsourced coders about their experiences with the task. The results suggested that online crowdsourcing can produce results comparable to researcher coding, but that the comparability of the results may differ across codes. This method may increase the efficiency of quantifying text-based data and provide evaluators with valuable feedback on their coding schemes. (As Provided). |
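The comparison described in the abstract, aggregating multiple crowd workers' codes per response and checking them against researcher codes, can be sketched in a few lines. This is a minimal illustration, not the authors' actual analysis: the thematic codes, the three-workers-per-response setup, and the use of majority vote with Cohen's kappa as the agreement statistic are all assumptions for the example.

```python
from collections import Counter

def majority_code(labels):
    """Aggregate one response's codes from several crowd workers by majority vote."""
    return Counter(labels).most_common(1)[0][0]

def cohens_kappa(a, b):
    """Chance-corrected agreement between two sequences of codes."""
    assert len(a) == len(b) and len(a) > 0
    n = len(a)
    p_obs = sum(x == y for x, y in zip(a, b)) / n
    cats = set(a) | set(b)
    p_exp = sum((a.count(c) / n) * (b.count(c) / n) for c in cats)
    return 1.0 if p_exp == 1 else (p_obs - p_exp) / (1 - p_exp)

# Hypothetical data: 3 crowd workers each coded 5 open-ended responses
# with a predetermined code list ("cost", "staff", "other").
crowd_raw = [
    ["cost", "cost", "staff"],
    ["staff", "staff", "staff"],
    ["cost", "other", "cost"],
    ["other", "other", "other"],
    ["staff", "cost", "staff"],
]
researcher = ["cost", "staff", "cost", "other", "cost"]

crowd = [majority_code(r) for r in crowd_raw]
kappa = cohens_kappa(crowd, researcher)
```

In practice the paper's point that comparability "may differ across codes" suggests computing such an agreement statistic per code (e.g. one-vs-rest) rather than only overall.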
Notes | SAGE Publications. 2455 Teller Road, Thousand Oaks, CA 91320. Tel: 800-818-7243; Tel: 805-499-9774; Fax: 800-583-2665; e-mail: journals@sagepub.com; Web site: http://sagepub.com |
Indexed by | ERIC (Education Resources Information Center), Washington, DC |
Updated | 2020/01/01 |